Feature Selection Based on Linear Separability and a CPL Criterion Function

Authors

  • Leon Bobrowski
Abstract

Linear separability of data sets is one of the basic concepts in the theory of neural networks and pattern recognition. Data sets are often linearly separable because of their high dimensionality. Such is the case with genomic data, where a small number of cases is represented in a space of extremely high dimensionality. An evaluation of the linear separability of two data sets can be combined with feature selection and carried out through minimisation of a convex and piecewise-linear (CPL) criterion function. The perceptron criterion function belongs to the CPL family. The basis exchange algorithms allow the minimal values of CPL functions to be found efficiently, even in the case of large, multidimensional data sets.
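As a concrete illustration (added here as a sketch, not part of the original abstract), one common way to write a perceptron-type CPL criterion for augmented, class-signed feature vectors y_j is

    \varphi_j(v) = \max\bigl(0,\ \delta_j - \langle v, y_j \rangle\bigr), \qquad \Phi(v) = \sum_j \alpha_j \, \varphi_j(v),

where v is the augmented weight vector, the margins \delta_j are usually taken equal to 1, and the prices \alpha_j are nonnegative. Each penalty \varphi_j is convex and piecewise-linear, hence so is \Phi, and its minimal value is zero exactly when the two data sets can be separated by a hyperplane.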

Related articles

Selection of the Linearly Separable Feature Subsets

We address a situation in which more than one feature subset allows for linear separability of the given data sets. Such a situation can occur when a small number of cases is represented in a highly dimensional feature space. A method of feature selection based on minimisation of a special criterion function is analysed here. This criterion function is convex and piecewise-linear (CPL). The proposed ...
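A minimal sketch of the separability test described above, offered as an assumption rather than the method from the paper: the perceptron-type CPL criterion is minimised with a generic linear-programming solver (scipy.optimize.linprog) instead of the basis-exchange algorithms mentioned in the abstract, and the function name cpl_minimum and the toy data are purely illustrative.

    # Illustrative sketch (not the algorithm from the paper): test whether two
    # labelled sets are linearly separable by minimising a perceptron-type CPL
    # criterion, recast as a linear programme.
    import numpy as np
    from scipy.optimize import linprog

    def cpl_minimum(X, y):
        """Minimum of sum_j max(0, 1 - y_j * (w . x_j + b)) over (w, b).

        A (numerically) zero minimum means the two classes are linearly
        separable in the feature space spanned by the columns of X.
        """
        m, n = X.shape
        # Decision variables z = [w (n), b (1), xi (m)]; minimise the sum of slacks xi.
        c = np.concatenate([np.zeros(n + 1), np.ones(m)])
        # Per-sample constraint y_j*(w . x_j + b) + xi_j >= 1, written as A_ub @ z <= b_ub.
        A_ub = np.hstack([-y[:, None] * X, -y[:, None], -np.eye(m)])
        b_ub = -np.ones(m)
        bounds = [(None, None)] * (n + 1) + [(0, None)] * m
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.fun

    # Toy usage: the classes are separable using feature 0 alone, but not feature 1.
    X = np.array([[0.0, 5.0], [0.2, -3.0], [1.0, 4.0], [1.2, -2.0]])
    y = np.array([-1.0, -1.0, 1.0, 1.0])
    for subset in ([0], [1], [0, 1]):
        print(subset, "->", round(cpl_minimum(X[:, subset], y), 6))

A zero minimum for a subset indicates that discarding the remaining features does not destroy linear separability, which is the property exploited when several feature subsets compete.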

Relaxed Linear Separability (RLS) Approach to Feature (Gene) Subset Selection

Feature selection is one of the active research areas in pattern recognition and data mining (Duda et al., 2001). The importance of feature selection methods becomes apparent in the context of the rapidly growing amount of data collected in contemporary databases (Liu & Motoda, 2008). Feature subset selection procedures aim to neglect as large a number as possible of such features (measure...

Ranked Modelling with Feature Selection Based on the CPL Criterion Functions

Ranked transformations should preserve a priori given ranked relations (an order) between some feature vectors. Designing ranked models includes feature selection tasks. Components of feature vectors which are not important for preserving the vectors' order should be neglected. In this way, unimportant dimensions of the feature space are greatly reduced. This is particularly important in the case of “lon...
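As an illustration of the ranked-model idea (an assumption about the general setup, not a claim taken from this particular paper), a linear ranked transformation y = \langle w, x \rangle preserves a given order between feature vectors when

    \langle w, x_k \rangle - \langle w, x_j \rangle \geq \delta > 0 \quad \text{for every ranked pair } x_j \prec x_k,

and violations of these inequalities can be penalised with CPL-type penalty functions of the same kind as those used above for linear separability, so that feature selection again reduces to minimising a convex and piecewise-linear criterion.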

Feature Selection Based on Information Theory, Consistency and Separability Indices

Two new feature selection methods are introduced, the first based on a separability criterion and the second on a consistency index that includes interactions between the selected subsets of features. Accuracy was compared against information-theory-based selection methods on several datasets, training neurofuzzy and nearest-neighbor methods on various subsets of selected features. Methods ba...

A class discriminability measure based on feature space partitioning

This paper presents a new class discriminability measure based on an adaptive partitioning of the feature space according to the available class samples. It is intended to be used as a criterion in a classifier-independent feature selection procedure. The partitioning is performed according to a binary splitting rule and appropriate stopping criteria. Results from several tests with Gaussian a...

Publication year: 2004